Tubular Neighbors for Regression and Classification

Author

  • Art B Owen
Abstract

The simple k nearest neighbor method is often very competitive, especially for classification problems. When the number of predictors is large, the nearest neighbors are likely to be quite distant from the target point. Furthermore, they tend to all be on one side of the target point. These are consequences of high dimensional geometry. This paper introduces a modification of nearest neighbors that explicitly takes into account the extrapolation required in high dimensions.
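For concreteness, the following is a minimal sketch in Python/NumPy, not the paper's tubular-neighbor method: it shows plain k-NN prediction and a small simulation of the two geometric effects the abstract mentions, namely that in high dimensions the nearest neighbors are far from the target and tend to fall on one side of it. The uniform design, sample size, dimensions, and k are arbitrary assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_predict(X, y, x0, k=5):
    """Plain k-NN: average the responses of the k training points nearest x0."""
    dist = np.linalg.norm(X - x0, axis=1)
    nearest = np.argsort(dist)[:k]
    return y[nearest].mean()

def neighbor_geometry(dim, n=1000, k=5):
    """Mean distance to the k nearest neighbors of a typical target point,
    and the fraction of those neighbors lying on one side of the target
    (the side of the hyperplane through x0 normal to the mean displacement)."""
    X = rng.uniform(-1.0, 1.0, size=(n, dim))
    x0 = rng.uniform(-1.0, 1.0, size=dim)          # a typical target point
    dist = np.linalg.norm(X - x0, axis=1)
    nearest = np.argsort(dist)[:k]
    disp = X[nearest] - x0                          # displacements to neighbors
    direction = disp.mean(axis=0)
    direction /= np.linalg.norm(direction)
    same_side = np.mean(disp @ direction > 0)       # 1.0 = all on one side
    return dist[nearest].mean(), same_side

for dim in (2, 10, 50):
    d_bar, side = neighbor_geometry(dim)
    print(f"dim={dim:2d}  mean neighbor distance={d_bar:.2f}  fraction on one side={side:.2f}")
```

As the dimension grows, the mean neighbor distance increases and the one-sided fraction approaches 1, which is the extrapolation problem the paper addresses.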


Similar Resources

2 Supervised Growing Cell

We present a new incremental radial basis function network suitable for classification and regression problems. Center positions are continuously updated through soft competitive learning. The width of the radial basis functions is derived from the distance to topological neighbors. During training, the observed error is accumulated locally and used to determine where to insert the next unit....
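To make the width heuristic concrete, here is a hedged sketch of just that one ingredient: each Gaussian unit's width is set from the distance to its nearest neighboring center, and the output weights are then fit by least squares. The incremental insertion and soft competitive center updates described in the abstract are not implemented; as a simplifying assumption, centers are taken as a random subset of the training inputs.

```python
import numpy as np

rng = np.random.default_rng(1)

def design_matrix(X, centers, widths):
    """Gaussian basis responses of every input to every unit."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-(d / widths) ** 2)

def fit_rbf(X, y, n_units=20):
    # Simplification: centers are a random subset of the inputs rather than
    # the incrementally grown, competitively updated centers of the abstract.
    centers = X[rng.choice(len(X), size=n_units, replace=False)]
    # Width of each unit = distance to its nearest neighboring center.
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    widths = d.min(axis=1)
    # Output weights by linear least squares on the basis responses.
    w, *_ = np.linalg.lstsq(design_matrix(X, centers, widths), y, rcond=None)
    return centers, widths, w

# Toy regression problem.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
centers, widths, w = fit_rbf(X, y)
pred = design_matrix(X, centers, widths) @ w
print("training RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```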


Nearest Neighbor Classification with a Local Asymmetrically Weighted Metric

This paper introduces a new local asymmetric weighting scheme for the nearest neighbor classification algorithm. It is shown both with theoretical arguments and computer experiments that good compression rates can be achieved, outperforming the accuracy of the standard nearest neighbor classification algorithm and obtaining almost the same accuracy as the k-NN algorithm with k optimised in each da...


Classification with Learning K-nearest Neighbors

The nearest neighbor (NN) classifiers, especially the k-NN algorithm, are among the simplest and yet most efficient classification rules and are widely used in practice. We introduce three adaptation rules that can be used in iterative training of a k-NN classifier. This is a novel approach both from the statistical pattern recognition and the supervised neural network learning points of view. The s...


On Weak Base Learners for Boosting Regression and Classification

The most basic property of the boosting algorithm is its ability to reduce the training error, subject to the critical assumption that the base learners generate weak hypotheses that are better than random guessing. We exploit analogies between regression and classification to give a characterization of what base learners generate weak hypotheses, by introducing a geometric concept called the an...
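The "better than random guessing" condition is easy to see in a standard AdaBoost sketch on decision stumps: whenever a stump's weighted error is below 1/2, its vote gets positive weight and the training error of the combined classifier keeps shrinking. This is the textbook algorithm for illustration only, not the geometric characterization the (truncated) abstract goes on to introduce.

```python
import numpy as np

rng = np.random.default_rng(2)

def best_stump(X, y, w):
    """Weighted-error-minimizing threshold stump on a single feature (labels +-1)."""
    best = (np.inf, 0, 0.0, 1)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] > t, sign, -sign)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, t, sign)
    return best

def adaboost(X, y, rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)     # sample weights
    F = np.zeros(n)             # combined score
    for _ in range(rounds):
        err, j, t, sign = best_stump(X, y, w)
        err = max(err, 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)   # positive iff better than random
        pred = np.where(X[:, j] > t, sign, -sign)
        F += alpha * pred
        w *= np.exp(-alpha * y * pred)          # reweight toward mistakes
        w /= w.sum()
        print(f"stump error={err:.3f}  training error={(np.sign(F) != y).mean():.3f}")

# Toy classification problem.
X = rng.standard_normal((200, 5))
y = np.where(X[:, 0] + 0.5 * X[:, 1] ** 2 > 0.3, 1, -1)
adaboost(X, y)
```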


Degree of Bending (DoB) in Tubular KT-Joints of Jacket Structures Subjected to Axial Loads

The fatigue life of tubular joints commonly found in the offshore industry depends not only on the value of the hot-spot stress (HSS), but is also significantly influenced by the through-the-thickness stress distribution characterized by the degree of bending (DoB). The determination of DoB values in a tubular joint is essential for improving the accuracy of fatigue life estimation using the stres...



Journal:

Volume   Issue

Pages  -

Publication date: 1999